Analyzing skin tones with AI

The problem with skin tone identification by algorithms

July 16, 2021  By Laura Rendell-Dean


Photo © Yassmine/Adobe Stock

Skin tone identification through artificial intelligence (AI) is part of our everyday lives, whether we are aware of it or not. If you are on social media, you are using skin tone recognition algorithms. The same intelligence drives facial recognition, which can place silly filters over your face or unlock your phone.

The technology can also be employed to detect nudity in social media posts. When you post a photo to Instagram or Facebook, or a lens to Snapchat, the image is automatically analyzed. If a large proportion of the pixels fall within skin-tone colours, the image is judged likely to include nudity and is flagged for review by a human moderator. Errors can still occur: if the image contains a large, neutral-coloured item such as a sofa, the AI can slip up and recognize it as a person.
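
To make the pixel-counting idea concrete, here is a minimal Python sketch of that kind of threshold check. The skin-tone rule of thumb and the 40 per cent threshold are assumptions for illustration only; real platforms rely on trained models rather than hand-written rules like this.

```python
from PIL import Image

def looks_like_skin(r: int, g: int, b: int) -> bool:
    """Crude rule of thumb: skin tones tend to be red-dominant,
    moderately bright and not strongly saturated."""
    return (r > 95 and g > 40 and b > 20
            and max(r, g, b) - min(r, g, b) > 15
            and abs(r - g) > 15 and r > g and r > b)

def flag_for_review(path: str, threshold: float = 0.4) -> bool:
    """Flag an image when the share of skin-like pixels exceeds a
    (hypothetical) platform threshold."""
    pixels = list(Image.open(path).convert("RGB").getdata())
    skin = sum(looks_like_skin(r, g, b) for r, g, b in pixels)
    return skin / len(pixels) > threshold
```

Note that a large beige sofa filling the frame satisfies the same rule pixel by pixel, which is exactly the false positive described above.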

Samples are handy
When it comes to detecting skin tones, an AI algorithm is trained on large numbers of sample images and videos until it can recognize people in images it has never seen. A key component of this process is measuring the proportion of skin-tone pixels in order to determine whether an image is of a human being.
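
As a toy illustration of that training step, the sketch below fits a simple classifier on a handful of hand-labelled pixel colours. All of the sample values and labels are invented for illustration; production systems learn from millions of labelled images, not a list this small.

```python
from sklearn.linear_model import LogisticRegression

# Hypothetical training data: (R, G, B) values labelled 1 = skin, 0 = not skin.
samples = [
    (224, 172, 105), (198, 134, 66), (141, 85, 36),  # skin tones
    (45, 34, 30), (30, 60, 200), (10, 180, 40),      # hair, clothing, grass
]
labels = [1, 1, 1, 0, 0, 0]

model = LogisticRegression().fit(samples, labels)

# The fitted model can now score pixel colours it has never seen.
print(model.predict([(210, 160, 120)]))  # likely [1]: skin-like
print(model.predict([(20, 20, 20)]))     # likely [0]: not skin-like
```
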
Skin tones are generally neutral beige and brown colours that can be extracted from images as colour clusters, which the algorithm then recognizes as human. AI uses the variation in those tones to pick out distinctive body parts: it can detect the eyes, nose and mouth of a person simply looking into a phone camera, and then place a multitude of filters, from dog ears to face warping, in real time. As mentioned earlier, it can also identify images of faces and bodies posted to social media platforms, and when a predetermined threshold of skin-tone pixels appears in a specific image or video, the AI will flag it for potential nudity.
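
The colour-cluster idea can also be sketched with off-the-shelf clustering: group an image's pixels into a few dominant colours, then keep the clusters that fall in a plausible beige-to-brown range. The cluster count and the range test below are assumptions for illustration, not how any particular platform works.

```python
import numpy as np
from PIL import Image
from sklearn.cluster import KMeans

def skin_clusters(path: str, k: int = 5) -> list:
    """Return the dominant colours of an image that look skin-like."""
    pixels = np.asarray(Image.open(path).convert("RGB")).reshape(-1, 3)
    centres = KMeans(n_clusters=k, n_init=10).fit(pixels).cluster_centers_
    # Keep clusters whose centre is a red-dominant, beige-to-brown tone.
    return [c for c in centres
            if c[0] > 95 and c[0] > c[1] > c[2] and c[0] - c[2] > 15]
```

Clustering only finds candidate skin regions; recognizing eyes, noses and mouths within them takes the trained models described above.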


Content tracker
In an experiment by the European Data Journalism Network, a content tracker was placed on the Instagram accounts of volunteers to see how often scantily clad men and women appeared at the top of their Explore pages. The data showed that images of women in bikinis or undergarments were 54 per cent more likely than other content to be boosted to the top. This content is targeted and boosted to other users by AI: the algorithm detects that the models in the photos are not wearing much clothing, and this ‘intelligence’ allows social media platforms to highlight similar content and photos to keep the targeted user on the site or app longer.

Another thing to note is that this algorithm flags photos of people with larger bodies more consistently than those with leaner physiques, likely because of the greater number of skin-tone pixels the AI detects. Numerous plus-sized models and influencers have spoken out about this imbalance, as their posts get flagged for nudity or are given age restrictions and content warnings. This is either a genuine mistake in the system or thinly veiled fatphobia, since larger bodies do not fit our society’s beauty standards. These images straddle the line between what is and isn’t deemed inappropriate content for the site, and the complications are being exposed (and hopefully fixed) all the time.

